
    Intrinsic Volumes of Polyhedral Cones: A combinatorial perspective

    The theory of intrinsic volumes of convex cones has recently found striking applications in areas such as convex optimization and compressive sensing. This article provides a self-contained account of the combinatorial theory of intrinsic volumes for polyhedral cones. Direct derivations of the General Steiner formula, the conic analogues of the Brianchon-Gram-Euler and the Gauss-Bonnet relations, and the Principal Kinematic Formula are given. In addition, a connection between the characteristic polynomial of a hyperplane arrangement and the intrinsic volumes of the regions of the arrangement, due to Klivans and Swartz, is generalized and some applications are presented. Comment: Survey, 23 pages
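
    As a concrete illustration of the objects surveyed here (not an example taken from the article itself), the conic intrinsic volumes of a polyhedral cone can be estimated by projecting standard Gaussian vectors onto the cone and recording the dimension of the face hit by the projection. For the nonnegative orthant R^d_+ the projection is a coordinatewise clamp and the exact values C(d,k)/2^d are known, so the estimate can be checked; the sketch below is a minimal Monte Carlo experiment assuming NumPy.

```python
# Monte Carlo estimate of the conic intrinsic volumes v_0, ..., v_d of the
# nonnegative orthant R^d_+.  For a polyhedral cone C, v_k(C) is the
# probability that the Euclidean projection of a standard Gaussian vector
# onto C lies in the relative interior of a k-dimensional face of C.  For
# the orthant, the face hit has dimension equal to the number of strictly
# positive coordinates, and the exact value is v_k = binom(d, k) / 2^d.
import numpy as np
from math import comb

def orthant_intrinsic_volumes(d, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    g = rng.standard_normal((n_samples, d))
    face_dims = (g > 0).sum(axis=1)          # dimension of the face hit
    counts = np.bincount(face_dims, minlength=d + 1)
    return counts / n_samples

d = 6
estimate = orthant_intrinsic_volumes(d)
exact = np.array([comb(d, k) / 2**d for k in range(d + 1)])
print("estimated:", np.round(estimate, 4))
print("exact:    ", np.round(exact, 4))
# Conic Gauss-Bonnet (for a cone that is not a linear subspace): the
# even-index intrinsic volumes sum to 1/2, as do the odd-index ones.
print("sum of even-index v_k:", estimate[::2].sum())
```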

    Gordon's inequality and condition numbers in conic optimization

    The probabilistic analysis of condition numbers has traditionally been approached from different angles; one is based on Smale's program in complexity theory and features integral geometry, while the other is motivated by geometric functional analysis and makes use of the theory of Gaussian processes. In this note we explore connections between the two approaches in the context of the biconic homogeneous feasibility problem and the condition numbers motivated by conic optimization theory. Key tools in the analysis are Slepian's and Gordon's comparison inequalities for Gaussian processes, interpreted as monotonicity properties of moment functionals, and their interplay with ideas from conic integral geometry.
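
    One of the key tools named above, Slepian's comparison inequality, is easy to check numerically: for centered Gaussian vectors with equal variances, lowering the pairwise correlations can only increase the expected maximum. The sketch below is only an illustration of this standard statement (independent versus equicorrelated coordinates), not of the paper's biconic feasibility analysis, and it assumes NumPy.

```python
# Numerical check of Slepian's comparison inequality: if X and Y are centered
# Gaussian vectors with equal variances and Cov(X_i, X_j) <= Cov(Y_i, Y_j)
# for all i != j, then E[max_i X_i] >= E[max_i Y_i].
# Here X has independent coordinates and Y is equicorrelated with rho = 0.5.
import numpy as np

rng = np.random.default_rng(0)
n, rho, n_samples = 20, 0.5, 200_000

x = rng.standard_normal((n_samples, n))              # independent coordinates

z = rng.standard_normal((n_samples, 1))              # shared factor
g = rng.standard_normal((n_samples, n))
y = np.sqrt(rho) * z + np.sqrt(1 - rho) * g          # Cov(Y_i, Y_j) = rho

print("E[max X] ~", x.max(axis=1).mean())            # larger expected maximum
print("E[max Y] ~", y.max(axis=1).mean())            # smaller expected maximum
```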

    Effective Condition Number Bounds for Convex Regularization

    We derive bounds relating Renegar's condition number to quantities that govern the statistical performance of convex regularization in settings that include the ℓ1-analysis setting. Using results from conic integral geometry, we show that the bounds can be made to depend only on a random projection, or restriction, of the analysis operator to a lower dimensional space, and can still be effective if these operators are ill-conditioned. As an application, we get new bounds for the undersampling phase transition of composite convex regularizers. Key tools in the analysis are Slepian's inequality and the kinematic formula from integral geometry. Comment: 17 pages, 4 figures. arXiv admin note: text overlap with arXiv:1408.301
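
    To make the undersampling phase transition referred to here concrete, the sketch below evaluates the standard statistical-dimension formula for the descent cone of the plain ℓ1 norm at an s-sparse vector (the recipe of Amelunxen, Lotz, McCoy and Tropp); its value predicts the number of generic measurements at which recovery switches from failure to success. This is the simple ℓ1 setting rather than the composite or ℓ1-analysis regularizers treated in the paper, and it assumes NumPy and SciPy.

```python
# Statistical dimension of the descent cone of the l1 norm at an s-sparse
# vector in R^d, via the standard one-dimensional formula
#   delta ~ inf_{tau >= 0}  s*(1 + tau^2) + (d - s) * E[(|g| - tau)_+^2],
# where g is standard normal and
#   E[(|g| - tau)_+^2] = 2*((1 + tau^2)*Phi(-tau) - tau*phi(tau)).
# The l1 undersampling phase transition is located near m ~ delta.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def l1_statistical_dimension(d, s):
    def objective(tau):
        tail = 2.0 * ((1.0 + tau**2) * norm.cdf(-tau) - tau * norm.pdf(tau))
        return s * (1.0 + tau**2) + (d - s) * tail
    res = minimize_scalar(objective, bounds=(0.0, 20.0), method="bounded")
    return res.fun

d = 1000
for s in (10, 50, 100):
    delta = l1_statistical_dimension(d, s)
    print(f"s = {s:4d}: transition predicted near m ~ {delta:.0f} of d = {d}")
```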

    Robust Smoothed Analysis of a Condition Number for Linear Programming

    We perform a smoothed analysis of the GCC-condition number C(A) of the linear programming feasibility problem ∃ x ∈ R^{m+1}: Ax < 0. Suppose that Ā is any matrix with rows ā_i of Euclidean norm 1 and, independently for all i, let a_i be a random perturbation of ā_i following the uniform distribution in the spherical disk in S^m of angular radius arcsin σ and centered at ā_i. We prove that E(ln C(A)) = O(mn/σ). A similar result was shown for Renegar's condition number and Gaussian perturbations by Dunagan, Spielman, and Teng [arXiv:cs.DS/0302011]. Our result is robust in the sense that it easily extends to radially symmetric probability distributions supported on a spherical disk of radius arcsin σ, whose density may even have a singularity at the center of the perturbation. Our proofs combine ideas from a recent paper of Bürgisser, Cucker, and Lotz (Math. Comp. 77, No. 263, 2008) with techniques of Dunagan et al. Comment: 34 pages. Version 3: only cosmetic changes
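
    To make the feasibility problem in this abstract concrete, strict feasibility of Ax < 0 can be decided with any LP solver by maximizing a slack variable. The sketch below uses scipy.optimize.linprog on made-up example data; it only illustrates the decision problem and does not compute the GCC-condition number or reproduce the smoothed analysis.

```python
# Strict feasibility check for the problem in the abstract: does there exist
# x in R^{m+1} with A x < 0 componentwise?  Maximize a slack t subject to
# A x + t*1 <= 0 with x kept in a box; the system is strictly feasible iff
# the optimal slack t is positive.
import numpy as np
from scipy.optimize import linprog

def strictly_feasible(A):
    n, d = A.shape                         # n constraints, d = m + 1 variables
    c = np.zeros(d + 1)
    c[-1] = -1.0                           # linprog minimizes, so minimize -t
    A_ub = np.hstack([A, np.ones((n, 1))]) # rows encode a_i . x + t <= 0
    b_ub = np.zeros(n)
    bounds = [(-1.0, 1.0)] * d + [(0.0, 1.0)]   # box keeps the LP bounded
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return bool(res.success) and -res.fun > 1e-9

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))            # example data, rows not normalized
print("A x < 0 strictly feasible:", strictly_feasible(A))
```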
